A numerical study of limited memory BFGS methods
Authors
Abstract
Similar resources
A Numerical Study of Limited Memory BFGS
The application of quasi-Newton methods is widespread in numerical optimization. Independently of the application, the techniques used to update the BFGS matrices seem to play an important role in the performance of the overall method. In this paper we address precisely this issue. We compare two implementations of the limited memory BFGS method for large-scale unconstrained problems. They differ...
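For reference, the core object in this comparison is the limited memory BFGS update of the inverse Hessian. Below is a minimal sketch of the standard two-loop recursion in its textbook form; it is offered only as background and is not claimed to match either of the specific implementations compared in the study.

```python
import numpy as np

def lbfgs_direction(g, s_list, y_list):
    """Standard L-BFGS two-loop recursion (textbook form).

    g       : current gradient
    s_list  : recent step differences  s_i = x_{i+1} - x_i, newest first
    y_list  : gradient differences     y_i = g_{i+1} - g_i, newest first
    Returns the quasi-Newton search direction p = -H_k g.
    """
    q = g.copy()
    rhos = [1.0 / y.dot(s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: newest pair to oldest pair.
    for s, y, rho in zip(s_list, y_list, rhos):
        a = rho * s.dot(q)
        alphas.append(a)
        q = q - a * y
    # Initial matrix H0 = gamma * I, the common scaling from the newest pair.
    if s_list:
        gamma = s_list[0].dot(y_list[0]) / y_list[0].dot(y_list[0])
    else:
        gamma = 1.0
    r = gamma * q
    # Second loop: oldest pair back to newest pair.
    for s, y, rho, a in zip(reversed(s_list), reversed(y_list),
                            reversed(rhos), reversed(alphas)):
        b = rho * y.dot(r)
        r = r + (a - b) * s
    return -r
```

The only free choices visible here, the memory length and the initial scaling gamma, are exactly the kind of implementation details whose effect such comparisons measure.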
Global convergence of online limited memory BFGS
Global convergence of an online (stochastic) limited memory version of the Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method for solving optimization problems with stochastic objectives that arise in large-scale machine learning is established. Lower and upper bounds on the Hessian eigenvalues of the sample functions are shown to suffice to guarantee that the curvature approximation ma...
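The eigenvalue bounds referred to above keep the curvature information used by the update well conditioned. One common practical safeguard, shown here purely as an illustration and not as the mechanism analysed in that paper, is to accept a correction pair only when it passes a curvature test relative to the step length.

```python
import numpy as np

def maybe_store_pair(s, y, s_list, y_list, m=10, delta=1e-4):
    """Keep a correction pair only if s^T y >= delta * ||s||^2, so the implied
    curvature stays bounded away from zero; drop pairs beyond the memory limit m.
    Illustrative safeguard, not the construction analysed in the paper above.
    """
    if s.dot(y) >= delta * s.dot(s):
        s_list.insert(0, s)          # newest first, matching the recursion above
        y_list.insert(0, y)
        del s_list[m:], y_list[m:]
    return s_list, y_list
```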
A class of diagonal preconditioners for limited memory BFGS method
A major weakness of the limited memory BFGS (LBFGS) method is that it may converge very slowly on ill-conditioned problems when the identity matrix is used for initialization. Very often, the LBFGS method can employ a preconditioner in place of the identity matrix to speed up convergence. For this purpose, we propose a class of diagonal preconditioners to boost the performance of the LBFGS method. In...
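To see where such a preconditioner enters, note that the initial matrix appears in the two-loop recursion only in the centre step r = H0 q, so a diagonal preconditioner amounts to replacing the scalar scaling with a vector of per-coordinate scalings. The update below is a generic, safeguarded componentwise secant heuristic given only for illustration; it is not the class of preconditioners proposed in that paper.

```python
import numpy as np

def diagonal_h0_update(s, y, d_prev, eps=1e-8):
    """Illustrative diagonal approximation of the inverse Hessian built from the
    latest correction pair (s, y); d_prev is the previous diagonal as a vector.
    Inside the two-loop recursion, `r = gamma * q` becomes `r = d * q`.
    """
    sy = s.dot(y)
    if sy <= eps:                       # curvature test failed: keep old diagonal
        return d_prev
    gamma = sy / y.dot(y)               # scalar scaling used as a fallback value
    d = np.where(np.abs(y) > eps, np.abs(s) / np.maximum(np.abs(y), eps), gamma)
    # Safeguard the entries so H0 stays comparable to the scalar scaling.
    return np.clip(d, 1e-3 * gamma, 1e3 * gamma)
```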
Limited Memory BFGS Updating in a Trust-Region Framework
The limited memory BFGS method pioneered by Jorge Nocedal is usually implemented as a line search method where the search direction is computed from a BFGS approximation to the inverse of the Hessian. The advantage of inverse updating is that the search directions are obtained by a matrix–vector multiplication. In this paper it is observed that limited memory updates to the Hessian approximatio...
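For context, the contrast drawn above is between inverse updating, where the line-search direction is the single matrix-vector product p = -H g, and updating the Hessian approximation itself, which fits naturally into a trust-region subproblem. The skeleton below is a generic Cauchy-point trust-region iteration with the model Hessian supplied as a matrix-vector product callable; it sketches the framework only and is not the limited memory updating scheme developed in that paper.

```python
import numpy as np

def trust_region_step(f, grad, x, Bv, delta, eta=0.1):
    """One generic trust-region iteration based on the Cauchy point (sketch).

    f, grad : objective and gradient callables
    Bv      : callable returning the model Hessian-vector product B @ v
    delta   : current trust-region radius
    """
    g = grad(x)
    gnorm = np.linalg.norm(g)
    if gnorm == 0.0:                       # stationary point: nothing to do
        return x, delta
    gBg = g.dot(Bv(g))
    # Cauchy point: minimizer of the quadratic model along -g inside the ball.
    tau = 1.0 if gBg <= 0 else min(gnorm ** 3 / (delta * gBg), 1.0)
    p = -tau * (delta / gnorm) * g
    # Predicted vs. actual reduction decides acceptance and the new radius.
    pred = -(g.dot(p) + 0.5 * p.dot(Bv(p)))
    rho = (f(x) - f(x + p)) / pred
    if rho < 0.25:
        delta *= 0.25
    elif rho > 0.75 and np.isclose(np.linalg.norm(p), delta):
        delta *= 2.0
    return (x + p, delta) if rho > eta else (x, delta)
```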
A regularized limited-memory BFGS method for unconstrained minimization problems
The limited-memory BFGS (L-BFGS) algorithm is a popular method of solving large-scale unconstrained minimization problems. Since L-BFGS conducts a line search with the Wolfe condition, it may require many function evaluations for ill-posed problems. To overcome this difficulty, we propose a method that combines L-BFGS with the regularized Newton method. The computational cost for a single iterat...
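A rough sketch of the idea of trading the Wolfe line search for regularization: accept or reject a regularized quasi-Newton step according to how well the model predicted the actual decrease, and adjust the regularization parameter accordingly. Everything below is a generic sketch; in particular, regularized_direction is a hypothetical placeholder for whatever solver produces the regularized step, and none of it is taken from that paper.

```python
import numpy as np

def regularized_qn_iteration(f, grad, x, regularized_direction, mu,
                             mu_min=1e-8, mu_max=1e8):
    """Generic acceptance loop for a regularized quasi-Newton step (sketch).

    regularized_direction(g, mu) is a placeholder callable that returns a
    direction d, e.g. an approximate solution of (B + mu I) d = -g; how that
    system is solved with limited memory information is not specified here.
    """
    g = grad(x)
    d = regularized_direction(g, mu)
    # If d solves (B + mu I) d = -g exactly, the model reduction is -g.dot(d)/2.
    pred = -0.5 * g.dot(d)
    actual = f(x) - f(x + d)
    ratio = actual / pred if pred > 0 else -np.inf
    if ratio >= 0.75:                  # very successful: relax regularization
        mu = max(mu_min, 0.5 * mu)
    elif ratio < 0.25:                 # poor model fit: strengthen regularization
        mu = min(mu_max, 4.0 * mu)
    x_new = x + d if ratio > 0 else x  # accept only if the objective decreased
    return x_new, mu
```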
Journal
Title: Applied Mathematics Letters
Year: 2002
ISSN: 0893-9659
DOI: 10.1016/s0893-9659(01)00162-8